Sequential Projected Newton method for regularization of nonlinear least squares problems

Authors

Abstract

We develop a computationally efficient algorithm for the automatic regularization of nonlinear inverse problems based on the discrepancy principle. We formulate the problem as an equality constrained optimization problem, where the constraint is given by the least squares data fidelity term and the objective function is a convex function that incorporates some prior knowledge, such as the total variation function. Using the Jacobian matrix of the nonlinear forward model, we consider a sequence of quadratically constrained sub-problems that can all be solved using the Projected Newton method. We show that the solution of each sub-problem results in a descent direction for an exact merit function. This is then used to describe a formal line-search method, as well as a slightly more heuristic approach that simplifies the algorithm and allows inexact solution of the sub-problems. We illustrate the behavior of the algorithm with a number of numerical experiments, with Talbot-Lau X-ray phase contrast imaging as the main application. The experiments confirm that the sub-problems need not be solved with high accuracy in early iterations to make sufficient progress towards the solution. In addition, the proposed method is able to produce reconstructions of similar quality compared to other state-of-the-art approaches with a significant reduction in computational time.
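The abstract only outlines the overall scheme, so the following is one possible reading of the outer loop, not the authors' implementation. It assumes a quadratic prior R(x) = ||Lx||^2 (L a regularization operator such as the identity or a discrete gradient), so that each linearized sub-problem can be handled by a simplified stand-in (a penalized solve with bisection on the multiplier) rather than the Projected Newton solver of the paper; the merit function and its fixed penalty weight rho are likewise heuristic, and the function names `linearized_subproblem` and `sequential_linearized_method` are illustrative.

```python
import numpy as np

def linearized_subproblem(J, r, L, x, sigma, mu_lo=1e-8, mu_hi=1e8, iters=60):
    """Stand-in for the quadratically constrained sub-problem
        min_s ||L (x + s)||^2   s.t.   ||J s + r|| ~= sigma,
    solved by bisection on the multiplier mu of the penalized problem
        min_s ||L (x + s)||^2 + mu ||J s + r||^2."""
    s = np.zeros_like(x)
    for _ in range(iters):
        mu = np.sqrt(mu_lo * mu_hi)                 # geometric bisection on mu
        A = L.T @ L + mu * (J.T @ J)
        s = np.linalg.solve(A, -(L.T @ (L @ x) + mu * (J.T @ r)))
        if np.linalg.norm(J @ s + r) > sigma:
            mu_lo = mu                              # residual too large: weight the data more
        else:
            mu_hi = mu
    return s

def sequential_linearized_method(F, Jac, d, x0, sigma, L, maxit=30, rho=10.0, tol=1e-6):
    """Outer loop: linearize F at x_k, solve the sub-problem, then backtrack on a
    simple penalty merit function m(x) = ||L x||^2 + rho * | ||F(x) - d|| - sigma |."""
    x = np.asarray(x0, dtype=float)
    merit = lambda z: np.linalg.norm(L @ z) ** 2 + rho * abs(np.linalg.norm(F(z) - d) - sigma)
    for _ in range(maxit):
        r = F(x) - d
        if abs(np.linalg.norm(r) - sigma) < tol:    # discrepancy principle satisfied
            break
        s = linearized_subproblem(Jac(x), r, L, x, sigma)
        t, m0 = 1.0, merit(x)
        while merit(x + t * s) > m0 - 1e-4 * t * np.linalg.norm(s) ** 2 and t > 1e-8:
            t *= 0.5                                # heuristic backtracking line search
        x = x + t * s
    return x
```

In a faithful implementation the inner solver would be the Projected Newton method applied to the quadratically constrained sub-problem (which also accommodates non-quadratic priors such as total variation), and the penalty weight in the merit function would be tied to the constraint multiplier rather than fixed.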


Similar articles

Approximate Gauss-Newton Methods for Nonlinear Least Squares Problems

The Gauss–Newton algorithm is an iterative method regularly used for solving nonlinear least squares problems. It is particularly well suited to the treatment of very large scale variational data assimilation problems that arise in atmosphere and ocean forecasting. The procedure consists of a sequence of linear least squares approximations to the nonlinear problem, each of which is solved by an...
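For context, a bare-bones version of the iteration described above (a sequence of linear least squares approximations) might look as follows. This is a generic Gauss-Newton loop, not the approximate or truncated variants analysed in that paper, and the toy exponential-fit model at the end is only for illustration.

```python
import numpy as np

def gauss_newton(F, Jac, x0, maxit=50, tol=1e-10):
    """Minimize 0.5*||F(x)||^2 by repeatedly solving min_s ||J(x) s + F(x)||."""
    x = np.asarray(x0, dtype=float)
    for _ in range(maxit):
        r = F(x)
        J = Jac(x)
        s, *_ = np.linalg.lstsq(J, -r, rcond=None)  # linearized least squares step
        x = x + s
        if np.linalg.norm(s) < tol * (1.0 + np.linalg.norm(x)):
            break
    return x

# Toy usage: recover a in y = exp(a*t) from noiseless data.
t = np.linspace(0.0, 1.0, 20)
y = np.exp(0.7 * t)
F = lambda a: np.exp(a[0] * t) - y
Jac = lambda a: (t * np.exp(a[0] * t)).reshape(-1, 1)
print(gauss_newton(F, Jac, np.array([0.0])))        # ~[0.7]
```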


Newton-Krylov Type Algorithm for Solving Nonlinear Least Squares Problems

The minimization of a quadratic function within an ellipsoidal trust region is an important subproblem for many nonlinear programming algorithms. When the number of variables is large, one of the most widely used strategies is to project the original problem into a small dimensional subspace. In this paper, we introduce an algorithm for solving nonlinear least squares problems. This algorithm i...
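The subspace-projection idea mentioned above can be sketched as follows, assuming a symmetric positive definite Hessian approximation B (e.g. a Gauss-Newton matrix J^T J): build a small Krylov basis, solve the trust-region subproblem in that basis, and map the step back to the full space. The reduced solver here is a plain bisection on the boundary multiplier, not the algorithm of the cited paper, and the names `krylov_basis` and `reduced_tr_step` are illustrative.

```python
import numpy as np

def krylov_basis(B, g, dim):
    """Orthonormal basis of span{g, Bg, ..., B^(dim-1) g} via Gram-Schmidt."""
    Q, v = [], g.copy()
    for _ in range(dim):
        for q in Q:
            v = v - (q @ v) * q
        nv = np.linalg.norm(v)
        if nv < 1e-12:
            break
        q = v / nv
        Q.append(q)
        v = B @ q
    return np.column_stack(Q)

def reduced_tr_step(B, g, Delta, dim=10):
    """Approximate  min_s g^T s + 0.5 s^T B s  s.t. ||s|| <= Delta  in a subspace."""
    Q = krylov_basis(B, g, dim)                 # n x k basis, k << n
    H, gr = Q.T @ B @ Q, Q.T @ g                # small projected data
    k = H.shape[0]
    y = np.linalg.solve(H, -gr)
    if np.linalg.norm(y) <= Delta:              # interior Newton step is feasible
        return Q @ y
    lam_lo, lam_hi = 0.0, 1.0
    while np.linalg.norm(np.linalg.solve(H + lam_hi * np.eye(k), -gr)) > Delta:
        lam_hi *= 10.0                          # bracket the boundary multiplier
    for _ in range(60):                         # bisection on ||y(lam)|| = Delta
        lam = 0.5 * (lam_lo + lam_hi)
        y = np.linalg.solve(H + lam * np.eye(k), -gr)
        if np.linalg.norm(y) > Delta:
            lam_lo = lam
        else:
            lam_hi = lam
    return Q @ np.linalg.solve(H + lam_hi * np.eye(k), -gr)
```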


Nonlinear least squares and regularization

I present and discuss some general ideas about iterative nonlinear output least-squares methods. The main result is that, if it is possible to do forward modeling on a physical problem in a way that permits the output (i.e., the predicted values of some physical parameter that could be measured) and the first derivative of the same output with respect to the model parameters (whatever they may be...


A reduced Newton method for constrained linear least-squares problems

We propose an iterative method that solves constrained linear least-squares problems by formulating them as nonlinear systems of equations and applying the Newton scheme. The method reduces the size of the linear system to be solved at each iteration by considering only a subset of the unknown variables. Hence the linear system can be solved more efficiently. We prove that the method is locally...
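A generic illustration of the "solve a reduced system in a subset of the variables" idea, applied to the nonnegativity-constrained problem min ||Ax - b||^2, x >= 0: variables judged to be at their bound are frozen and the least squares solve involves only the free ones. This active-set sketch omits the safeguards and the specific nonlinear-system formulation of the cited method; `reduced_newton_nnls` is an illustrative name.

```python
import numpy as np

def reduced_newton_nnls(A, b, maxit=100, tol=1e-10):
    """Crude active-set iteration for min ||A x - b||^2 subject to x >= 0."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(maxit):
        grad = A.T @ (A @ x - b)
        # Free variables: strictly positive, or at the bound but pulled inward.
        free = (x > 0) | (grad < 0)
        if not free.any() or np.linalg.norm(grad[free]) < tol:
            break                                 # KKT conditions (approximately) hold
        Af = A[:, free]                           # reduced system in the free variables only
        xf, *_ = np.linalg.lstsq(Af, b, rcond=None)
        x_new = np.zeros(n)
        x_new[free] = xf
        x = np.maximum(x_new, 0.0)                # project back onto the constraints
    return x
```

A production implementation would add a line search or other anti-cycling safeguard; the point here is only that each iteration solves a smaller linear system than the full problem.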


Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis

We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...



Journal

Journal title: Journal of Physics Communications

Year: 2021

ISSN: 2399-6528

DOI: https://doi.org/10.1088/2399-6528/ac2371